Statistical guarantees for regularized neural networks

Authors

Abstract

Neural networks have become standard tools in the analysis of data, but they lack comprehensive mathematical theories. For example, there are very few statistical guarantees for learning neural networks from data, especially for classes of estimators that are used in practice or at least similar to such. In this paper, we develop a general statistical guarantee for estimators that consist of a least-squares term and a regularizer. We then exemplify this guarantee with ℓ1-regularization, showing that the corresponding prediction error increases at most logarithmically in the total number of parameters and can even decrease in the number of layers. Our results establish a basis for regularized estimation of neural networks, and they deepen our understanding of deep learning more generally.
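The estimator described in the abstract combines a least-squares term with a regularizer. A minimal numpy sketch of the ℓ1-regularized objective for a toy one-hidden-layer ReLU network is below; the names (`W1`, `W2`, `lam`) and the network shape are illustrative assumptions, not the paper's specific setup:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # n = 50 samples, d = 3 features
y = rng.normal(size=50)        # responses

W1 = rng.normal(size=(3, 8))   # input -> hidden weights (illustrative sizes)
W2 = rng.normal(size=8)        # hidden -> output weights
lam = 0.1                      # regularization strength

def predict(X, W1, W2):
    # One-hidden-layer ReLU network: f(x) = W2 . relu(W1^T x)
    return np.maximum(X @ W1, 0.0) @ W2

def objective(X, y, W1, W2, lam):
    # Least-squares term plus an l1 penalty on all parameters
    residual = y - predict(X, W1, W2)
    l1 = np.abs(W1).sum() + np.abs(W2).sum()
    return np.sum(residual**2) + lam * l1

print(objective(X, y, W1, W2, lam))
```

Setting `lam = 0` recovers the unregularized least-squares fit; larger values of `lam` shrink the parameters toward zero, which is the mechanism behind the parameter-count-logarithmic error bounds the abstract refers to.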


Similar references

Regularized EM Algorithms: A Unified Framework and Statistical Guarantees

Latent models are a fundamental modeling tool in machine learning applications, but they present significant computational and analytical challenges. The popular EM algorithm and its variants are much-used algorithmic tools, yet our rigorous understanding of their performance is highly incomplete. Recently, work in [1] has demonstrated that for an important class of problems, EM exhibits linear ...


Manifold Regularized Discriminative Neural Networks

Unregularized deep neural networks (DNNs) can easily overfit with a limited sample size. We argue that this is mostly due to the discriminative nature of DNNs, which directly model the conditional probability (or score) of labels given the input. Ignoring the input distribution makes it difficult for DNNs to generalize to unseen data. Recent advances in regularization techniques, such as pretrain...


Manifold regularized deep neural networks

Deep neural networks (DNNs) have been successfully applied to a variety of automatic speech recognition (ASR) tasks, both in discriminative feature extraction and hybrid acoustic modeling scenarios. The development of improved loss functions and regularization approaches has resulted in consistent reductions in ASR word error rates (WERs). This paper presents a manifold learning based regulari...


Resource allocation for statistical QoS guarantees in MIMO cellular networks

This work considers the performance of the downlink channel of MIMO cellular networks serving multiple users with different statistical QoS requirements. The paper proposes resource allocation algorithms that aim to optimize the system performance over the sum of the optimal user utility functions by employing the effective capacity theory. Proportionally fair resource allocation among the user...


BitNet: Bit-Regularized Deep Neural Networks

We present a novel regularization scheme for training deep neural networks. The parameters of neural networks are usually unconstrained and have a dynamic range dispersed over the real line. Our key idea is to control the expressive power of the network by dynamically quantizing the range and set of values that the parameters can take. We formulate this idea using a novel end-to-end approach th...
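The BitNet abstract's key idea is constraining parameters by quantizing the range and set of values they can take. A generic uniform-quantization sketch of that idea is below; it is an assumption-laden illustration, not the paper's end-to-end BitNet scheme, and `quantize` and `bits` are hypothetical names:

```python
import numpy as np

def quantize(w, bits=2):
    # Snap weights onto 2**bits evenly spaced levels spanning their range,
    # limiting the set of values the parameters can take.
    lo, hi = w.min(), w.max()
    levels = 2**bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return lo + np.round((w - lo) / scale) * scale

w = np.array([-1.3, 0.2, 0.7, 2.5])
wq = quantize(w, bits=2)   # at most 4 distinct values
```

Each weight moves by at most half a quantization step, so the approximation error is bounded by the dynamic range divided by the number of levels.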



Journal

Journal title: Neural Networks

Year: 2021

ISSN: 1879-2782, 0893-6080

DOI: https://doi.org/10.1016/j.neunet.2021.04.034